Estimation Optimality of Corrected AIC and Modified Cp in Linear Regression
Authors
Abstract
Model selection criteria often arise by constructing unbiased or approximately unbiased estimators of measures known as expected overall discrepancies (Linhart & Zucchini, 1986, p. 19). Such measures quantify the disparity between the true model (i.e., the model which generated the observed data) and a fitted candidate model. For linear regression with normally distributed error terms, the “corrected” Akaike information criterion and the “modified” conceptual predictive statistic have been proposed as exactly unbiased estimators of their respective target discrepancies. We expand on previous work to additionally show that these criteria achieve minimum variance within the class of unbiased estimators.
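To make the corrected criterion concrete, here is a minimal sketch of how AICc is computed for a normal linear regression model fit by ordinary least squares. The formulas follow the standard definitions (AIC = n log(RSS/n) + 2k, with the Hurvich–Tsai small-sample correction 2k(k+1)/(n−k−1)); the function name and the simulated data are illustrative assumptions, not part of the paper.

```python
import numpy as np

def aic_aicc(y, X):
    """AIC and corrected AICc for a normal linear model fit by OLS.

    k counts the regression coefficients plus the error variance,
    following the usual parameter count for AICc.
    """
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS fit
    rss = float(np.sum((y - X @ beta) ** 2))       # residual sum of squares
    k = p + 1                                      # coefficients + sigma^2
    aic = n * np.log(rss / n) + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)     # small-sample correction
    return aic, aicc

# Illustrative data: intercept + 3 regressors, only the first two matter.
rng = np.random.default_rng(0)
n = 30
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
y = X @ np.array([1.0, 2.0, 0.0, 0.0]) + rng.normal(size=n)
aic, aicc = aic_aicc(y, X)
```

Because the correction term is strictly positive for n > k + 1, AICc always penalizes complexity more heavily than AIC, and the gap grows as k approaches n, which is what drives its advantage in small samples.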
Related resources
Efficiency for Regularization Parameter Selection in Penalized Likelihood Estimation of Misspecified Models
It has been shown that AIC-type criteria are asymptotically efficient selectors of the tuning parameter in non-concave penalized regression methods under the assumption that the population variance is known or that a consistent estimator is available. We relax this assumption to prove that AIC itself is asymptotically efficient and we study its performance in finite samples. In classical regres...
Discrepancy-based algorithms for best-subset model selection
The selection of a best-subset regression model from a candidate family is a common problem that arises in many analyses. In best-subset model selection, we consider all possible subsets of regressor variables; thus, numerous candidate models may need to be fit and compared. One of the main challenges of best-subset selection arises from the size of the candidate model family: specifically, the...
Moment convergence of regularized least-squares estimator for linear regression model
In this paper we study the uniform tail-probability estimates of a regularized least-squares estimator for the linear regression model, by making use of the polynomial type large deviation inequality for the associated statistical random fields, which may not be locally asymptotically quadratic. Our results provide a measure of rate of consistency in variable selection in sparse estimation, whic...
Bias of the corrected AIC criterion for underfitted regression and time series models
The Akaike information criterion, AIC (Akaike, 1973), and a bias-corrected version, AICc (Sugiura, 1978; Hurvich & Tsai, 1989), are two methods for selection of regression and autoregressive models. Both criteria may be viewed as estimators of the expected Kullback-Leibler information. The bias of AIC and AICc is studied in the underfitting case, where none of the candidate models includes the t...
Asymptotic bootstrap corrections of AIC for linear regression models
The Akaike information criterion, AIC, and its corrected version, AICc, are two methods for selecting normal linear regression models. Both criteria were designed as estimators of the expected Kullback-Leibler information between the model generating the data and the approximating candidate model. In this paper, two new corrected variants of AIC are derived for the purpose of small sample linear...
Published 2005